The Bias Cut: Toward a Technopoetics of Algorithmic Systems.

Citation metadata

Date: Fall 2022
Publisher: Editorial Board of Catalyst: Feminism, Theory, Technoscience
Document Type: Article
Length: 8,086 words


Abstract: 

This essay explores a material engagement with discourses of bias. At a time when the developers of algorithmic systems are exploring concepts of bias like never before, textile bias (or the skew of woven material) offers an alternative view into the scripts of computational engagement. To probe this potential, this essay engages a range of feminist and anti-racist interventions in performance arts, critical archival studies, and my own pedagogical collaborations. With these experiments, I ask, how might material bias inform ongoing analysis of cultural bias within machine learning systems? The experiments reveal interwoven dynamics of power, labor, and historicity with particular attention to complicity and change. Through angular encounters with bias, I explore the development of an emerging technopoetics of algorithmic systems.

Keywords

algorithmic systems, bias, machine learning, materiality, performance arts, textile arts

Full Text: 
If it is the case that inequity and injustice [are] woven into the very 
fabric of our societies, then that means each twist, coil, and code is 
a chance for us to weave new patterns, practices, and politics. The 
vastness of the problem will be its undoing once we accept that we are 
pattern makers.
--Ruha Benjamin, in Khari Johnson (2020)

In the spring of 2014, Sudanese Australian author and mechanical engineer Yassmin Abdel-Magied (2014) gives a TEDx talk near Melbourne, Australia. She begins with a striking question: "Someone who looks like me walks past you in the street--do you think they're a mother, a refugee, or a victim of oppression--or do you think they're a cardiologist, a barrister, or maybe your local politician?"

Abdel-Magied is speaking about bias. She's wearing a copper and coral colored hijab that gently drapes around her head and neck, complementing her abaya, a long, embroidered cloak made of stiff fibers. Her point is prescient: "What the world expects of me and the way I'm treated depends on the arrangement of this piece of cloth... This is about looking beyond your bias."

For Abdel-Magied, as for many, bias refers to a habit of mind. Bias is a "tendency to favour or dislike a person or thing, especially as a result of a preconceived opinion" (OED Online 2021)--a stereotype about someone based on characteristics such as race, gender, and disability (Blodgett et al. 2020). In the TEDx theater, bias describes the way Abdel-Magied's audience--many of whom likely identify as white and Judeo-Christian--see the piece of cloth and make negative assumptions about her as a Muslim woman of color. The cloth exposes the audience's bias.

What Abdel-Magied doesn't mention is that the very arrangement of cloth not only reveals bias, it is bias--at least of a certain kind. Her cloth drapes with the help of the textile grain, the strands of thread that traverse the body not on the up-and-down but along the slanted angle. From this etymology, bias refers to "a slanting or sloping line, a diagonal" and "a line running diagonally or at an angle to the warp or weft of a woven fabric" (OED Online 2021). To a tailor, bias refers to that powerful tilt. A fabric made from interlocking threads that run horizontally and vertically turns on the diagonal. To "cut on the bias" is to intentionally skew woven material, allowing gravity to do its work. It is to push against the grain, to disrupt a binary of warp and weft. The interruption of Euclidean geometry creates a soft and fitting drape. Working on the bias makes any number of things possible, from headscarves to cloth diapers, to haute couture.

Abdel-Magied is speaking a few years before what has seemed like a clarion call to fix technology bias. Search algorithms that harmfully equate dark skin with wrongdoing; facial recognition systems that misgender people at airports; motion-tracking software that monitors employee cleanliness (Browne 2019; Costanza-Chock 2020; Eubanks 2018; Fox et al. 2019; Noble 2018). Think of surveillance scholar Simone Browne's (2019, 115) analysis of wrongful algorithmic determinations, such as the fingerprint readers used to establish Muslim American Brandon Mayfield as a suspect in the 2004 Madrid, Spain, train bombings. By turning a body into data, such techniques reinforce "a logic of prototypical whiteness" (Browne 2019, 162) that disciplines minoritarian subjects and renders fluid and intersectional identities unseen (Benjamin 2019; Bennett et al. 2021; Bosley et al. 2022; Browne 2019, 114). Back in the TEDx theater, the cultural bias that disproportionately affects people along existing lines of inequity (race, class, gender, disability) is on some minds, but it hasn't fully entered the public consciousness when it comes to technology. (Unless you see textiles as a kind of technology, too.)

This essay explores the associations between fiber and computational development that animate alternative scripts of technology development, and specifically the possibility of an emerging technopoetics of algorithmic systems. Reading across the bias of historian Marisa J. Fuentes (2016) and English scholar Linda Brodkey (1994), I put the cuts, orientations, and angles of technoscience and archival and performance theory in conversation (Adeyemi 2019; Ahmed 2006a, 2006b, 2010; Barad 2008; Judd 2019). Informed by this analysis, I turn to a class I designed with my colleagues Afroditi Psarra and Gabrielle Benabdallah at the University of Washington where we begin with a piece of cloth (Psarra, Rosner, and Benabdallah 2021). (1) Titling our course "On the Bias," our first exercise asks students to create a dataset by cutting the fabric on the bias. Some follow the woven pattern, finding a visual rhythm with each cut. Others focus on their scissors, observing the dull edges and clunky lines. Still others notice their own competencies and inexperience. By materializing their data collection process, students pay attention to the materials, tools, and skillsets that come to make matter matter. They use fabric to distill multiple and overlapping influences on each cut--encountering datasets as made and not found. With these bias studies, I explore the aesthetic possibility of alternative data practices and performances, sketching the outlines of an algorithmic technopoetics. (2)

Textiles, Technology, and Liberatory Potential

The links between fiber and bits run deep. As scholars of media and materiality remind, the work of encoding information can be traced to ancient entanglements of string (Ascher 1983; Ingold 2012; Johnson and Joyce 2022; Kuchler and Carroll 2020; Monteiro 2017; Wernimont 2019). Consider the knot work of Andean quipus. In some six hundred pre-Columbian quipus that survived Spanish colonization, small knots put into colorful threads signify categories of knowledge such as calendrical data, tax assessments, and census records (Ascher 1983; de la Puente 2019; Urton and Chu 2019). Through variations in length, string color, fastening, and pattern, the wearer can put their hands over the knots and interpret the encoded information, becoming a tactile decoder (Gschwandtner 2008, 275; Moten 2003, 53). (3)

Centuries later, the field of computing inherited this tactile legacy (Monteiro 2017; Wernimont 2019). When British Countess Ada Lovelace worked on the Analytical Engine, what is widely considered the first computer, she gestured at the connection with string (Monteiro 2017). "We may say most aptly that the Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves," she wrote in 1843 (Menabrea 1843, 696; Monteiro 2017). Factories throughout nineteenth-century Great Britain installed the Jacquard loom to speed up the delivery of everyday textile objects from curtains to scarves. The loom ran on a system of punch cards. The position and frequency of holes punched into small pieces of cardstock directed the movement of threads on the loom, producing complex designs through the same punch-card logic that early computers would later adopt. Ada Lovelace saw the alliance: a computer as weaving machine.

This connection undergirded patterns of exploitation within the early decades of the computing industry, compelling profits from gendered and racialized subjects (Amrute 2016; Hicks 2017; Monteiro 2017; Nakamura 2014; Wernimont 2019). Navajo women in Shiprock, New Mexico, manufactured chips (Nakamura 2014), and operators in Waltham, Massachusetts, wired the code that sent the Apollo spacecraft to the moon (Rosner 2018; Rosner et al. 2018; Shorey and Rosner 2019). In each case, stereotypes associated with "nimble fingers" and textile propensities justified the extraction of labor from minoritarian workers (Nakamura 2014).

From the knot-work of Andean quipus and sewn British book bindings, to punch-card looms and biosensor meshes, material instantiates the difference that makes a difference (Appadurai and Alexander 2020; Bateson 1972). Wires that work as threads; units of digital memory that fill patchwork quilts; techniques of embroidery that encode the path to the moon and back. It doesn't take a giant imaginative leap to see how the chips of computers have grain, pattern, and thread count (Monteiro 2017)--and how cutting along the bias can produce something wildly new.

This radical possibility reflects compelling arguments made by scholars who link textiles with histories of activism and abolition (Arellano 2022; Brown 1989; Morrison 2018; Perez-Bustos et al. 2019; Smith 2019). Oral history interviews document the use of quilting patterns to encode navigation for people escaping enslavement through the Underground Railroad (Tobin and Dobard 1999). Using fiber to convene community, a range of artists and activists have reckoned with gendered and racial violence (Freeby 2020; Kendzior 2015; Sutton and Vacarezza 2022) to reflect on the subtleties of reclaiming material relationships to safety and home (Koiki n.d.). Century-old photographs of activist and abolitionist Sojourner Truth knitting expose a tactical complexity. In one portrait, Truth sits with yarn and knitting needles in hand, a pose she might have used to hide a disability in her right hand (Minister 2012). Looking back on the image, scholars take care to understand the situated and partially hidden significance of stitch work (Cutter 2020)--exploring to what extent a liberatory potential might be found from reading into fiber. (4)

From Cultural to Material Bias

As an increasingly prominent context for discussing bias, machine learning systems exemplify a formulation of cultural bias rooted in individual prejudice and premised on its identification and removal (Park et al. 2019). This treatment locates bias within a binary logic (bias/debias) wherein harmful assumptions and decisions get baked into an entity (a person, thing, or system) (Benjamin 2019; Eubanks 2018; Noble 2018). For example, technology scholar Ruha Benjamin (2019) describes the use of algorithms to "correct" bias within marketing campaigns and usability testing schemes. But when the algorithms get used to reinforce existing power relationships--such as the devaluing of Black users in the determination of accents recognized by the Apple voice assistant Siri--those same systems raise important questions around heightened discrimination: "who ultimately profits from the proliferation of ethnically tailored marketing?" (Benjamin 2019, 15). Departing from an individuated treatment, technology scholars such as Benjamin read bias as a question of situated knowledge and positioning, a condition relevant to all interpersonal, structural, and institutional activity, including decision making within technology corporations (Benjamin 2019; Hoffmann 2021; Lee 2018). From this view, cultural bias names the vantage point encoded in software and shaping technological scripts, including who is seen, who sees, and to what end (Benjamin 2019; Browne 2019).

Compared with cultural bias, material bias carries a more procedural meaning. While working with woven fabric, the bias refers to the directional composition of fabric, or the orientation of yarns in the textile (OED Online 2021). Most woven textiles are made of interlocking horizontal (weft) and vertical (warp) threads, known as the crosswise and lengthwise grains. The bias grain runs across those orthogonal yarns. In sewing patterns, the term true bias refers to a 45° angle from the woven fabric edge or selvage (the finished edge that prevents the textile from unravelling) (OED Online 2021).

Handle woven fabric, and you'll learn about bias. Pull on the lengthwise grain, and the fabric keeps its form; depending on the fiber, it might not move at all (a cotton textile barely stretches, for example). Pull on the crosswise grain, and the fabric exhibits a bit more give. But turn the fabric at 45° and pull, and you'll notice a significant spring. The fabric gains elasticity. Thanks to this stretch and flexibility, tailors insert strips of bias-cut fabric along the edge of garments, quilts, or bibs. Covering armholes and necklines with a bias strip (also called bias tape) allows the tailor to finish rough edges, bind hems, and give the perimeter new strength. (5)

Compared with techniques of spinning or weaving, structures like bias grain have figured less prominently in critical textiles scholarship. When it does appear, the analysis of bias grain tends to focus on the work of Progressive-Era fashion icon Madeleine Vionnet, who popularized the bias-cut technique featured on gowns worn by Hollywood stars such as Greta Garbo and Marlene Dietrich (Di Trocchio 2011). (6) But bias-cut applications have long permeated textile techniques, from the bias plaiting basketry of Coast Salish weavers to the tightly fitting churidar pyjama pants worn across South Asia. (7) In the development of an early form of digital information storage called core memory, embroidering a wire along the diagonal grid of physical "bits" encoding information made that information not only readable but also rewritable by a computer program (Rosner et al. 2018; Shorey and Rosner 2019). Today medical robotics researchers use the bias cut to develop artificial muscles (Naclerio and Hawkes 2020) and automotive engineers use the bias cut to reduce leaks in automatic transmissions (Kuroki and Sowa 1991). (8)

Stretching its potential, Brodkey's (1994) theory of writing on the bias and Fuentes's (2016) method of reading along the bias grain bring new theoretical range to this material machinery. (9) In an unusual autobiographical essay on learning to write, Brodkey uses textile bias to critique a masculinized plug-and-play approach to literacy discourse, exposing its feigned objectivity and strict application of rules. She calls for a feminized learning-by-doing approach, a form of writing practice she names (after her mother's scrupulous eye for a garment's detail) writing on the bias. Bias allows for a three-dimensional thinking that gives fabric structure and purpose just as it gives the writing significance and meaning. "Without a bias," she notes, "language is only words as cloth is only threads" (1994, 546).

Where writing on the bias lends structure and purpose, reading on the bias brings elasticity to interpretation. In her reckoning with gaps and silences in the colonial archives of Bridgetown, Barbados, Fuentes "expands the legibility of these archival documents" (2016, 78) by considering residual traces of enslaved women's presence. Reading the case law, for example, she uses archival gaps around the gendered camouflage of an unnamed enslaved boy disguised as a Black woman to expose differential gendered and sexual assumptions for enslaved and white women, showing how experiences that are conspicuously absent from traditional historical archives still shape analysis (Hartman 2008). "Reading along the bias grain stretches the archive to accentuate the presence of enslaved women when not explicitly mentioned in certain documents," Fuentes observes (2016, 156). (10)

Materiality Askew

These reflexive potentials within material bias (textile or otherwise) complement a wider scholarship on materialism, a large and scattered body of work concerned with what material worlds open and constrain (Braidotti 2013; Minh-Ha 2014; TallBear 2017). Reviewing this scholarship across the past century, anthropologist Eric Plemons (2013) identifies three major periods: the 1920s attention to the conditions of the human brain and its relation to consciousness and will; the 1970s development of the sociology of scientific knowledge and its focus on embodiment; and the 2000s concern for the agential, consequential, and non-innocent character of matter. In this latter focus, scholars focus their analysis on the kinds of conceptualizations the material offers for theorizing social and political life, and particularly the perspectives that language (or discourse) elides. This new model of the world, often rooted in the natural sciences, manifests in the influence of biological thinking on the work of Donna Haraway and theoretical physics on the work of Karen Barad. Barad's (2008) agentic readings of matter emerge from a revisiting of Niels Bohr's "philosophy physics" (xi) wherein physical models change foundational understandings of the social, asking how "matter comes to matter" (244). Barad locates this question in a particular understanding of agency as a forceful and situated condition that prompts analysts to follow the agential cut. This cut describes an act of differentiation, a perspective taken when something is foregrounded and something else backgrounded, when one agential force is separated from another.

If the agential cut marks a material differentiation, then the bias cut--that separation of fabric along the diagonal--might speak to a tactical making of difference, a kind of agential cut with angular repercussions, a mingling of orientations (overt and covert, direct and indirect, aesthetic and utilitarian). In her essays on orientation, Sara Ahmed (2006a, 2006b, 2010) considers this perspective taking through the framework of queer phenomenology. "What does it mean to be oriented?" she asks. "What difference does it make what we are oriented toward?" (2006a, 543). Grounding her inquiry in the phrase sexual orientation, Ahmed considers what is queer in phenomenology, finding that queerness offers a starting point for considering objects that "orient" (in) phenomenological traditions (2006a, 544). Orientation for Ahmed refers to the "effect of what we tend toward" (2010, 247), that which shapes and ingrains a body's capacities to see and comprehend. Using the example of a table, she illustrates the kinds of labors that fall to the background during use: the hands that brought that table into the room and scuffed it at the door, the histories of development and use that shape matter but often fall to the background. In this reflection, she emphasizes the tendency or habituation of that limited stance. Orientation calls attention to the repeated bodily shaping a person acquires, the work necessary to take a perspective on the surrounding world.

Revisiting orientation in a recent discussion of queer use, Ahmed identifies the angularities of design--specifically (via Margaret Price) the winding redirection of access signs that orient wheelchair users through space, creating alternative opportunities for use. "Modifications can also be reorientations... Those who need modifications to enable them to access spaces are directed away from the paths that are busiest, the paths where things usually happen" (Ahmed 2019, 241). She grounds this argument in the phenomenology of Merleau-Ponty and his analysis of vertical perception as well as Hegel's argument that to "stand upright" has become a habit and no longer a willful act. For Ahmed, analyzing queerness as both a designerly and physiological accomplishment works to generate new readings of use, including the use of computational developments (2019, 241). In this formulation, being upright or straight identifies a particular accomplishment whereas being topsy-turvy or weird marks a point of departure from the norm and a possible site of creativity.

Finding creative potential within a world premised on straightness involves locating, experimenting with, and reworking orientation--or what performance theorist Kemi Adeyemi (2019) might call the lean. In her close readings of Black queer performance, such as Rashaad Newsome's Shade Compositions, she connects the angle of performers' bodies on stage with a refusal of masculinist, ableist, and racist vantage points. "I am thinking about the literal ground a lot lately because I have spent far too much time watching people like [Walter] Scott being ground into it," she writes of the everyday brutality of racist police violence (2019, 2). If the horizontal adjudicates carceral enforcement for Black and Indigenous people, the vertical delimits ideas of "good" posture, "up-right-ness," "standing tall." Adeyemi roots the dominance of the 90° angle in Judeo-Christian beliefs such as man's fall from the heavens and "circumscribed by masculinist investments in the phallus as that which indexes one's abilities to access these planes in the first place, and dependent upon the violent othering of disabilities that would make it difficult if not impossible to physically hold one's whole body upright" (2). By contrast, the lean refers to the "onto-kinetic mechanism" (6) of a body that transcends a 90° tethering to the ground. Reading the leaning bodies of the twenty-one Black women and gender-nonconforming people in Shade Compositions, Adeyemi calls for "do[ing] away with up down and begin[ning] from other angles to do and theorize politics, black and otherwise" (6). (11)

To find this onto-kinetic mechanism within machine learning developments is to uproot the "imperative of verticality" (2019, 4), to use Adeyemi's term. It is to call to mind what interdisciplinary writer, artist, and performer Bettina Judd has called an "irreducible unknown" (2019, 143) in her discussion of poet Lucille Clifton's automatic and spirit writing, the psychic ability to unconsciously produce written words. Judd's work offers a particularly profound and nuanced practice for using machine learning systems for poetic introspection. To explore the unknown, she sets up an AI translation experiment where she speaks in tongues, transcribes her speech, and then feeds the text to Google Translate to see if "the software could decipher meaning. It did not, and in some sense, it did. It deciphered a meaning in my revisiting its translations. More meaning than I could cognate with the text of the tongues itself. The gift of tongues is not about meaning, but experience. I cannot translate the totality of that experience for you" (144).

In her phased process of translation--inserting the transcribed text, requesting a new language translation, composing a poem, repeating the language translation--Judd skews the interaction. She does not interact with the machine through the prescribed angle of a transaction (input text, receive result). She reorients her use toward a spiritual residue (a transcendence of queer use, to use Ahmed's term). Reflecting on her Baptist and Pentecostal upbringing, she describes the practice of speaking in tongues as both alluring and frightening. "What if the spirit didn't decide to take over me?... Was I considered less saved?" (Judd 2019, 143). In this oblique reading of an algorithm, she revisits the elusive--a transformation that might dismiss, but might also pull her in.

Since learning of this project, I have been captivated by the experiment it sets up. Rather than identify and remove bias within natural language processing or disavow a platform like Google Translate, Bettina Judd comes to complicate and inspirit, to re-embody (Judd 2019, 138). She does not deny her entanglement with the sociotechnical, she occupies it.

In the Classroom: Pedagogical Experiments

This question of occupation within machine learning systems informed a set of classroom exercises I co-developed with my colleagues Afroditi Psarra and Gabrielle Benabdallah during the summer of 2021 (Benabdallah et al. 2022). Inspired by textile lineages, we gathered exercises and readings (work by Kemi Adeyemi, Sara Ahmed, and Ruha Benjamin, among others) and invited guest lecturers (including transdisciplinary artists Stephanie Dinkins and Lauren Lee McCarthy). We designed the course to prompt students invested in a data science minor to probe the political and aesthetic potential of algorithmic systems.

During our first week of class, when we asked students to create a dataset by cutting cloth on the bias, one student, Esteban Yosef Agosin, recorded the soundscape of his cut. Charting the varied noise frequencies of each snip, he used the recorded audio to notice differences in the dataset: some cuts sounded relatively long and muffled, while others felt quick and discordant, almost grating. "My goal for this dataset would be to measure the stress level when someone is cutting cloth... to speculate if someone is emotionally fine or not," he explained. The recording exposed layers of interpretation to what stress sounds like and how it does or does not get recorded within a dataset--a collection of records not unlike those used to train machine learning algorithms that classify posttraumatic stress disorder from speech data. Reflecting on the bias of codification, he explored the nuances of making stress newly accountable to the senses.

For the final exhibition, we invited students to start from a dataset and create a new or reinvented artifact. Doctoral student Chari Glogovac-Smith created a series of images that encoded Audre Lorde's poems in algorithmically generated photographs of braided Black hair. To create the images, they trained machine learning models on images of braided Black hairstyles to produce synthetic but realistic renderings. (12) Exploring the variety and angularity of techniques in the resulting compositions, they noticed a set of unique but uniform patterns, much like a language. Leveraging this patterning, they treated the braided images as a text and created an algorithmic map that turned each phrase into a unique hairstyle. With this imagery they describe "holding space for the dichotomy between modern society's demonization of black hair expression and its simultaneous appropriation of the same expression." Treating the impossible geometries of algorithmically generated imagery as language, they reinvent a covert "art of black hair."

Undergraduate student Kathryn Reyes began with images of Binakul, an Indigenous weaving pattern from the Philippines' Ilocos and Abra regions, where her family is from. The pattern uses repetition and curvature to warp the geometries of a 90° grid. Contemporary comparisons to 1960s op art expose the dominance of Western weaving ontologies amid longer Indigenous textile lineages threatened by dwindling access to natural resources and scarce documentation. Training a machine learning model on the textile images, Reyes produced a set of algorithmically generated Binakul imagery and developed a short film that animates between them. Projecting the film alongside a description of one of the few surviving prewar master weavers working with Binakul, Reyes explains that the project highlights "an absence of tangibility" made visible through "an institutional effort to sustain a dying art."

Across the bias cuts, braided poetry, and Binakul patterns, the On the Bias projects share a certain sensibility. They do not remove harmful forms of machine learning bias. Instead, they infuse neural networks with a fictive force. In their engagement with questions of bias, reflecting on their own encounters and associations, they reimagine the material world and its modes of conditioning.

Exercises like these amount to a multi-threaded exploration. Rather than begin with a future, they look forward within the present--beginning from experiences in the here and now (Dinkins n.d.). Rather than seek completion, they embrace incompletion--orienting toward the unfinished and unknown (Harney and Moten 2020). Rather than put the audience in someone else's place, they render the audience adjacent and interconnected--seeking opportunities to be, feel, and care alongside one another (Campt 2021). And rather than call for agency, they inspire collective transcendence--activating our roles as coconspirators (Gibson forthcoming). In these four maneuvers--from there to here, from resolution to incompletion, from displacement to adjacency, from possession to accomplice--students in the classroom explore how a technopoetics of algorithmic systems might help remake our "biased" lives.

Working against the Grain

Technology cultures often locate bias within a person (Blodgett et al. 2020). When designers propagate their individual judgments across massive datasets and machine learning systems, they amplify those judgments and their potential harms (Amrute 2022; Noble 2021). But our pedagogical experiments organize bias as a particular opportunity. They render an interior judgment an anterior event: never fully determining the phenomena judged (Hoffmann 2021). Those lines of contact make possible a range of phenomena, from tech policy decisions to fintech privacy features. Bringing a material imagination to our contact with bias, and one another, can help rebuild not only our technological worlds but our roles within them.

To recall the theater within which Yassmin Abdel-Magied presents is to see a similar potential. The event solicits reflection but gives little space for criticism or response (Giridharadas 2019). It promotes individual updates over structural transformations. It feeds TEDx branding across a collection of websites and organizational hubs. I first found the video via the Lean In organization, an outgrowth of a book of the same name written by Facebook's Sheryl Sandberg. "Each year, Lean In partners with McKinsey & Co. to study the state of women in the workforce," the website declares (Lean In n.d.). But to lean is not to lean in, Anna Lauren Hoffmann reminds. (13) Where Sandberg's lean in positions the (white feminist) subject as self-empowering, goal-directed, and affirming a neoliberal regime, Adeyemi's lean recognizes the (minoritarian) subject as suspended within and reckoning with the violence of neoliberal governance. Leaning, in Sandberg's prose, does not map to the psychic or political possibilities of the lean that Adeyemi describes.

On the TEDx stage, fabric tells a particular story about bias. That something material (a hijab) shapes something cultural (a negative assumption about someone else). That what someone uses (wears, inhabits, works with) prompts a particular way of seeing the world and one another. But also, that a treatment of bias is not to remove it. It is to use it: to collectively work against the grain, to skew our relationships with its everyday use, and to turn its conditions into an invitation for change. The material imagination of textile bias takes our understanding beyond recognition. It shows how within the conventional problem solved, the issue resolved, and the harm redressed, the angle of our engagement matters.

Amid our skewed orientations, what we find is not an answer. No solution or ending. Nothing that we can turn into a commodity, apply to our wounds, or move from place to place. What we find is a type of algorithmic technopoetics, an experimental praxis that begins with the angularities of fabric. Fabric is a material gathering of strings, of different lines and timelines, and with them, a convening of different histories, social relationships, and possibilities within our collective present (Brodkey 1994). The student work in the classroom drew on these angularities to bring together multiple hands, histories, layers of fiber and meaning. In this gathering, they suggest a certain aesthetic of communality. By paying attention to the assumptions built into machine learning systems, they formulate bias as a common performance--showing what bodies and algorithms produce in tandem.

An algorithmic technopoetics grounded in angular praxes might reveal that what we think of as technology biases--those harms amplified by bits and code (Friedman and Nissenbaum 1996)--are also and fundamentally about relationships. They help us see and understand our proximities. They shape how we come into relations of connection amid difference, and what those relationships mean within an ever-changing and discordant landscape. If we think of our visualizations, technologies, and cities as tied to a grid, then to combine material and cultural bias is to work along the diagonal, to reorient our approach to each other in our digital and physical surrounds. It is to practice a collective imagining, a mode of anticipating a world where we can learn to become common to one another (Chambers-Letson 2018, xvii).

The technopoetic encounter described so far opens bias as a means of inquiry. Cutting on the bias invites a practice of reorienting our bodies to our algorithmic surroundings, sensing new technological scripts through alternative angles, and taking stock of what an oblique relationship with algorithmic systems reveals about us and our connections with one another. To reorient, to lean, to skew, to stretch is to bring new readings of material engagements to our cultural ones. It is to see an aesthetic possibility in relationships (individual, collective, institutional, technical) that may feel hopeless but are nonetheless structured with pattern and thread.

Acknowledgments

I am grateful to several colleagues for their generous comments, suggestions, and conversations, including Kemi Adeyemi, Neta Alexander, Gabrielle Benabdallah, Laura Juliana Cortes Rico, Helen Gibson, Anna Lauren Hoffmann, Bettina Judd, Lauren Klein, Beth Kolko, Laura Forlano, Marcel O'Gorman, Romi Ron Morrison, Tania Perez-Bustos, James Pierce, Afroditi Psarra, Jihan Sherman, Jacqueline Wernimont, the New School GIDEST seminar, UW HCDE TAT Lab colleagues, and UW On the Bias students.

Notes

(1) For a visual review of course activities, see collaborative documentation authored with several participants in the "On the Bias" class (Benabdallah et al. 2022; Lustig and Rosner 2022). The course complements thinking on textile bias metaphors beyond artistic and technological inquiry, including writing by Lutz Bornmann (2007) and Carol Tavris (2021).

(2) Here I follow Louis Chude-Sokei's astute definition of technopoetics as a "loose and far from canonical term for those engagements with technology as they manifest in the realm of literary, philosophical, musical, or broader cultural realms" (2015, 11). A review of the rich and varied literature on technopoetics is beyond the scope of this essay, but I take particular inspiration from ritualistic research traditions within design and technology arts, including work by Jihan Sherman and colleagues (Sherman et al. 2019).

(3) Hawaiian string figures, or hei, expose a complementary positioning. As the kumu hula, kumu oli, and kumu 'olelo Hawai'i scholar Kalani Akana (2012) explains, the movement of strings along hands to create images does more than reference a material world. It also repeats and ritualizes histories of movement, orienting the body toward particular sensitivities and noticings, as well as rehearsing knowledge and narrative wherein patterning works as mnemonic practice. Hands in strings produce the codes of collective memory.

(4) See forthcoming dissertation work by Jihan Sherman. For additional examples of critical artistic inquiry connected with fiber, see work by Hinekura Smith (2019) on Maori weaving practice as a decolonizing research methodology; Sonia C. Arellano (2022) on quilting as feminist inquiry in the representation of migrant deaths; Laura Juliana Cortes Rico, Tania Perez-Bustos, and colleagues on textile and engineering connections amid Colombian reconciliation (Cortes Rico et al. 2020; Perez-Bustos et al. 2019); and Ron Eglash (1999) on African fractal patterns. Within wider artistic technoscientific inquiry traditions, see complementary inventive approaches by Nettrice Gaskins (2014), Kat Jungnickel (2018, 2020), Natalie Loveless (2019), and Hannah Star Rogers and colleagues (2021; Rogers 2022).

(5) Adjacent fashion applications include bias-cut ruffles and bias-cut stockings.

(6) Bias grain allowed dressmakers like Madeleine Vionnet to create garments with a snug fit across the midriff and soft drape below the waistline, using the fabric's shape-clinging elasticity and drape to emphasize the curves of a body.

(7) Histories of woven cloth tend to begin almost 30,000 years ago with the evidence of fabric imprinted into clay (Barber 1991, 1994, 1999). Archeologists tend to agree that the sophistication of these found techniques suggests a much longer timeline, likely beginning earlier in the Palaeolithic period (Adovasio, Soffer, and Klima 1996), an era stretching back nearly 2.5 million years. While histories of woven cloth tend not to focus on grain, those that do exist connect bias with fashion trends, from Indian saris (Varghese and Thilagavathi 2012) to Hollywood art deco gowns (Di Trocchio 2011).

(8) Cutting along the bias grain allows the artificial muscle tube to radially expand with applied pressure (Naclerio and Hawkes 2020) and the rotary shaft's seal rings to expand by sliding circumferentially toward one another (Kuroki and Sowa 1991).

(9) In a footnote, Marisa Fuentes (2016) contrasts reading along the bias grain with Ann Stoler's work with colonial records "along the archival grain" (2010), which places greater emphasis on revealing "weaknesses in colonial power," Fuentes (2016, 153) explains. See also complementary work by Jen Clary-Lemon (2022) on the selvedge and Marcel O'Gorman (2021) on inclinations and Homo inclinus.

(10) Despite its importance across cultural and technological histories, material bias rarely features as a topic of analysis for scholars of technology. Few if any books or stand-alone articles exist on the topic of bias grain. Wikipedia articles on related topics of textile bias and textile grain combine to less than one thousand words (at the time of writing). In its absence, material bias represents both a powerful and underexplored tool--exposing what a concern for skew brings to an understanding of ourselves.

(11) "Hegemony of the vertical," Ann Cooper Albright writes, comes out of "the Christian/capitalist complex that insists what is up is good (stock markets, tall buildings, bank accounts, and other assorted 'fill in the blanks') and what is down is bad" (2017, 64, quoted in Adeyemi 2019, 20).

(12) The machine learning models students used, called generative adversarial networks (GANs), involve two neural networks--a generator and a discriminator--that compete with one another to become more accurate in their predictions.
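The adversarial dynamic can be sketched in miniature. The toy example below is my own illustration rather than the models used in the course: a one-dimensional "generator" with two scalar parameters learns to mimic samples from a target distribution by fooling a logistic "discriminator," with the two networks updated in alternation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_toy_gan(steps=3000, batch=32, lr=0.05, seed=0):
    """Toy 1-D GAN: generator G(z) = a*z + b learns to mimic N(3, 1)."""
    rng = np.random.default_rng(seed)
    a, b = 1.0, 0.0   # generator parameters
    w, c = 0.1, 0.0   # discriminator: D(x) = sigmoid(w*x + c)
    for _ in range(steps):
        # Discriminator step: real samples labeled 1, fakes labeled 0.
        real = rng.normal(3.0, 1.0, batch)
        z = rng.normal(0.0, 1.0, batch)
        fake = a * z + b
        d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
        # Cross-entropy gradients, averaged over the batch.
        gw = np.mean((d_real - 1) * real) + np.mean(d_fake * fake)
        gc = np.mean(d_real - 1) + np.mean(d_fake)
        w -= lr * gw
        c -= lr * gc
        # Generator step: push D to label fakes as real.
        z = rng.normal(0.0, 1.0, batch)
        fake = a * z + b
        d_fake = sigmoid(w * fake + c)
        dx = -(1 - d_fake) * w   # gradient of -log D(fake) w.r.t. fake
        a -= lr * np.mean(dx * z)
        b -= lr * np.mean(dx)
    return a, b

a, b = train_toy_gan()
```

After a few thousand alternating steps, the generator's offset b drifts toward the target mean of 3. Each network improves only relative to the other, so the training signal itself keeps shifting.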

(13) Informal conversation with Anna Lauren Hoffmann, July 2021.

References

Abdel-Magied, Yassmin. 2014. "What Does My Headscarf Mean to You?" Filmed May 2014 in Southbank, Australia. TEDx video, 13:53. https://www.ted.com/talks/yassmin_abdel_magied_what_does_my_headscarf_mean_to_you?language=en.

Adeyemi, Kemi. 2019. "Beyond 90°: The Angularities of Black/Queer/Women/Lean." Women & Performance: A Journal of Feminist Theory 29 (1): 9-24. https://doi-org.proxy.library.upenn.edu/10.1080/0740770X.2019.1571861.

Adovasio, James M., Olga Soffer, and Bohuslav Klima. 1996. "Palaeolithic Fiber Technology: Data from Pavlov I, ca. 26,000 B.P." Antiquity, no. 70, 526-34.

Ahmed, Sara. 2006a. "Orientations: Toward a Queer Phenomenology." GLQ: A Journal of Lesbian and Gay Studies 12 (4): 543-74. https://www-muse-jhu-edu.proxy.library.upenn.edu/article/202832.

--. 2006b. Queer Phenomenology. Durham, NC: Duke University Press.

--. 2010. "Orientations Matter." In New Materialisms: Ontology, Agency, and Politics, edited by Diana Coole and Samantha Frost, 234-57. Durham, NC: Duke University Press.

--. 2019. What's the Use?: On the Uses of Use. Durham, NC: Duke University Press.

Akana, Kalani. 2012. "Hei, Hawaiian String Figures: Hawaiian Memory Culture and Mnemonic Practice." Multi-disciplinary Research on Hawaiian Well-Being, no. 8, 45-70.

Amrute, Sareeta. 2016. Encoding Race, Encoding Class: Indian IT Workers in Berlin. Durham, NC: Duke University Press.

--. 2022. "What the Facebook Files Tell Us about Racial Capitalism." Interactions 29 (2): 59-61. https://doi-org.proxy.library.upenn.edu/10.1145/3511697.

Albright, Ann Cooper. 2017. "The Perverse Satisfaction of Gravity." In The Aging Body in Dance: A Cross-Cultural Perspective, edited by Nanako Nakajima, and Gabriele Brandstetter, 63-73. New York: Routledge.

Appadurai, Arjun, and Neta Alexander. 2020. Failure. Medford: Polity Press.

Arellano, Sonia. 2022. "Quilting as a Qualitative, Feminist Research Method: Expanding Understandings of Migrant Deaths." Rhetoric Review 41(1): 17-30. https://doi-org.proxy.library.upenn.edu/10.1080/07350198.2021.2002058.

Ascher, Marcia. 1983. "The Logical-Numerical System of Inca Quipus." Annals of the History of Computing 5 (3): 268-78. https://doi-org.proxy.library.upenn.edu/10.1109/MAHC.1983.10090.

Barad, Karen. 2007. "Meeting the Universe Halfway." In Meeting the Universe Halfway. Durham, NC: Duke University Press.

Barber, Elizabeth Wayland. 1991. Prehistoric Textiles: The Development of Cloth in the Neolithic and Bronze Ages with Special Reference to the Aegean. Princeton, NJ: Princeton University Press.

--. 1994. Women's Work: The First 20,000 Years: Women, Cloth, and Society in Early Times. New York: W.W. Norton and Company.

--. 1999. The Mummies of Urumchi. New York: W.W. Norton and Company.

Bateson, Gregory. 1972. Steps to an Ecology of Mind. New York: Chandler Publishing Co.

Benabdallah, Gabrielle, Ashten Alexander, Sourojit Ghosh, Chariell Glogovac-Smith, Lacey Jacoby, Caitlin Lustig, Anh Nguyen et al. 2022. "Slanted Speculations: Material Encounters with Algorithmic Bias." In Designing Interactive Systems Conference, 85-99. https://doi-org.proxy.library.upenn.edu/10.1145/3532106.3533449.

Bennett, Cynthia L., Cole Gleason, Morgan Klaus Scheuerman, Jeffrey P. Bigham, Anhong Guo, and Alexandra To. 2021. "'It's Complicated': Negotiating Accessibility and (Mis)Representation in Image Descriptions of Race, Gender, and Disability." CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, article no. 375, 1-19. https://doi-org.proxy.library.upenn.edu/10.1145/3411764.3445498.

Blodgett, Su Lin, Solon Barocas, Hal Daume III, and Hanna Wallach. 2020. "Language (Technology) Is Power: A Critical Survey of 'Bias' in NLP." In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 5454-5476. Association for Computational Linguistics. https://doi-org.proxy.library.upenn.edu/10.18653/v1/2020.acl-main.485.

Bornmann, Lutz. 2007. "Bias Cut." Nature 445 (566). https://doi-org.proxy.library.upenn.edu/10.1038/nj7127-566a.

Bosley, Brooke, Takeria Blunt, Jihan Sherman, Brandy Pettijohn, Britney Johnson, Amber G. Johnson, Blaire Bosley, Susana M. Morris, and Andre Brock. 2022. "Bringing Black Feminist's Thoughts, Self-Definitions, and Creative Agency to Digital Media and Technology Design." In Proceedings of the 55th Hawaii International Conference on System Sciences. http://hdl.handle.net.proxy.library.upenn.edu/10125/79680.

Braidotti, Rosi. 2013. Metamorphoses: Towards a Materialist Theory of Becoming. Cambridge, UK: Polity.

Brodkey, Linda. 1994. "Writing on the Bias." College English 56 (5): 527-47. https://www-jstor-org.proxy.library.upenn.edu/stable/378605.

Brown, Elsa Barkley. 1989. "African-American Women's Quilting." Signs: Journal of Women in Culture and Society 14 (4): 921-929. https://doi-org.proxy.library.upenn.edu/10.1086/494553.

Browne, Simone. 2015. Dark matters: On the Surveillance of Blackness. Durham, NC: Duke University Press.

Campt, Tina M. 2021. A Black Gaze: Artists Changing How We See. Cambridge, MA: MIT Press.

Chambers-Letson, Joshua. 2018. After the Party: A Manifesto for Queer of Color Life. New York: NYU Press.

Chude-Sokei, Louis. 2015. The Sound of Culture: Diaspora and Black Technopoetics. Middletown, CT: Wesleyan University Press.

Clary-Lemon, Jennifer. 2022. "Selvedge Rhetorics and Material Memory." Peitho 24 (3). Accessed June 27, 2022. https://cfshrc.org/article/selvedge-rhetorics-and-material-memory.

Costanza-Chock, Sasha. 2020. Design Justice: Community-led Practices to Build the Worlds We Need. Cambridge, MA: MIT Press.

Crawford, Kate, and Vladan Joler. 2018. "Anatomy of an AI System." https://anatomyof.ai.

Cutter, Martha J. 2020. "The Fugitive Gazes Back: The Photographic Performance Work of Frederick Douglass and Sojourner Truth." InMedia 8 (2). https://doi-org.proxy.library.upenn.edu/10.4000/inmedia.2463.

De la Puente, Jose Carlos. 2019. "Calendars in Knotted Cords: New Evidence on How Khipus Captured Time in Nineteenth-Century Cuzco and Beyond." Ethnohistory 66 (3): 437-64. https://doi-org.proxy.library.upenn.edu/10.1215/00141801-7517868.

Dinkins, Stephanie. 2021. "Secret Garden--audio transcripts" (website). Accessed June 27, 2022. https://www.stephaniedinkins.com/sg_texts.html.

Di Trocchio, Paola. 2011. "Exhibition Review: Madeline Vionnet: Fashion Purist--The World According to Madeleine Vionnet." Fashion Theory 15 (4): 517-23. https://doi-org.proxy.library.upenn.edu/10.2752/175174111X13115179150035.

Eglash, Ron. 1999. African Fractals: Modern Computing and Indigenous Design. New Brunswick NJ: Rutgers University Press.

Eubanks, Virginia. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York, NY: St. Martin's Press.

Friedman, Batya, and Helen Nissenbaum. 1996. "Bias in Computer Systems." ACM Transactions on Information Systems 14 (3): 330-347. https://doi-org.proxy.library.upenn.edu/10.1145/230538.230561.

Fox, Sarah E., Kiley Sobel, and Daniela K. Rosner. 2019. "Managerial Visions: Stories of Upgrading and Maintaining the Public Restroom with IoT." In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-15.

Freeby, Ashley. n.d. Ashley M. Freeby (website). Accessed June 25, 2022. https://www.ashleyfreeby.com.

Fuentes, Marisa J. 2016. Dispossessed Lives: Enslaved Women, Violence, and the Archive. Philadelphia, PA: University of Pennsylvania Press.

Gaskins, Nettrice. 2014. "Techno-Vernacular Creativity, Innovation and Learning in Underrepresented Ethnic Communities of Practice." PhD diss., Georgia Institute of Technology. http://hdl.handle.net.proxy.library.upenn.edu/1853/53163.

Gibson, Helen. Forthcoming. "Teaching Towards Calvin Warren's Nonmetaphysical Historiography," in Philipp Loffler, Nathalie Rauscher, and Welf Werner, eds., American Studies: A Monograph Series. Heidelberg: Universitatsverlag Winter.

Giridharadas, Anand. 2019. Winners Take All: The Elite Charade of Changing the World. New York, NY: Vintage.

Gruwell, Leigh. 2022. Making Matters: Craft, Ethics, and New Materialist Rhetorics. Boulder, CO: University Press of Colorado.

Gschwandtner, Sabrina. 2008. "Knitting Is.... " Journal of Modern Craft 1 (2): 271-78. https://doi-org.proxy.library.upenn.edu/10.2752/174967808X325532.

Harney, Stefano, and Fred Moten. 2021. "All Incomplete." Minor Compositions (blog). June 18, 2021. https://www.minorcompositions.info/?p=1032.

Hartman, Saidiya. 2008. "Venus in Two Acts." Small Axe: A Caribbean Journal of Criticism 12 (2): 1-14. https://doi-org.proxy.library.upenn.edu/10.1215/-12-2-1.

Hicks, Mar. 2017. Programmed Inequality: How Britain Discarded Women Technologists and Lost its Edge in Computing. Cambridge, MA: MIT Press.

Hoffmann, Anna Lauren. 2021. "Even When You Are a Solution You Are a Problem: An Uncomfortable Reflection on Feminist Data Ethics." Global Perspectives 2 (1): 21335. https://doi-org.proxy.library.upenn.edu/10.1525/gp.2021.21335.

Ingold, Tim. 2010. "The Textility of Making." Cambridge Journal of Economics 34 (1): 91-102. https://doi-org.proxy.library.upenn.edu/10.1093/cje/bep042.

--. 2016. Lines: A Brief History. London: Routledge.

Johnson, Khari. 2020. "Ruha Benjamin on Deep Learning: Computational Depth without Sociological Depth Is 'Superficial Learning.'" Venture Beat, April 29, 2020. https://venturebeat.com/2020/04/29/ruha-benjamin-on-deep-learning-computational-depth-without-sociological-depth-is-superficial-learning/.

Johnson, Lisa M., and Rosemary A. Joyce, eds. 2022. Materializing Ritual Practices. Boulder, CO: University Press of Colorado.

Judd, Bettina. 2019. "GLOSSOLALIA: Lucille Clifton's Creative Technologies of Becoming." In Black Bodies and Transhuman Realities: Scientifically Modifying the Black Body in Posthuman Literature and Culture, edited by Melvin G. Hill, 133-49. Lanham, MD: Lexington Books.

Jungnickel, Kat. 2018. Bikes and Bloomers: Victorian Women Inventors and Their Extraordinary Cycle Wear. Cambridge, MA: MIT Press.

Jungnickel, Kat, ed. 2020. Transmissions: Critical Tactics for Making and Communicating Research. Cambridge, MA: MIT Press.

Kendzior, Sarah. 2015. "Ferguson's Radical Knitters: 'If Someone Asks Me What I'm Doing, I Say, "I'm Knitting for Black Liberation."'" The Guardian, August 6, 2015. https://www.theguardian.com/us-news/2015/aug/06/ferguson-radical-knitters-talk-justice-race-issues.

Koiki, Bukola. n.d. Bukola Koiki (website). Accessed June 27, 2022. https://www.bukolakoiki.com.

Kuchler, Susanne, and Timothy Carroll. 2020. A Return to the Object: Alfred Gell, Art, and Social Theory. Oxford: Routledge.

Kuroki, Toshihiko, and Mitsuhiro Sowa. 1991. "Investigation of Leakage Phenomena and Improvement of Sealing Performance of Seal Rings Used for Rotary Shafts of Automatic Transmissions." SAE Technical Paper 910534. https://doi-org.proxy.library.upenn.edu/10.4271/910534.

Lean In. n.d. "What We Do." Accessed June 27, 2022. https://leanin.org/about.

Lee, Nicol Turner. 2018. "Detecting Racial Bias in Algorithms and Machine Learning." Journal of Information, Communication and Ethics in Society 16 (3). https://doi-org.proxy.library.upenn.edu/10.1108/JICES-06-2018-0056.

Loveless, Natalie. 2019. How to Make Art at the End of the World. Durham, NC: Duke University Press.

Lustig, Caitlin, and Daniela K. Rosner. 2022. "From Explainability to Ineffability? ML Tarot and the Possibility of Inspiriting Design." In Designing Interactive Systems Conference, 123-136. https://doi-org.proxy.library.upenn.edu/10.1145/3532106.3533552.

Menabrea, Luigi Federico. 1843. "Sketch of the Analytical Engine Invented by Charles Babbage, Esq." Translated by Ada Lovelace, in Scientific Memoirs, Selected from the Transactions of Foreign Academies of Science and Learned Societies, and from Foreign Journals, vol. 3, edited by Richard Taylor, 666-731. London: Richard and John E. Taylor.

Minh-Ha, Trinh T. 2014. When the Moon Waxes Red: Representation, Gender and Cultural Politics. New York: Routledge.

Minister, Meredith. 2012. "Female, Black, and Able: Representations of Sojourner Truth and Theories of Embodiment." Disability Studies Quarterly 32 (1). https://dsq-sds.org/article/view/3030/3057.

Monteiro, Stephen. 2017. The Fabric of Interface: Mobile Media, Design, and Gender. Cambridge, MA: MIT Press.

Morrison, Romi Ron. 2018. "Rituals of Black Fugitivity" (website). Accessed June 27, 2022. https://elegantcollisions.com/rituals-of-black-fugativity-p.

Moten, Fred. 2003. In the Break: The Aesthetics of the Black Radical Tradition. Minneapolis: University of Minnesota Press.

Naclerio, Nicholas D., and Elliot W. Hawkes. 2020. "Simple, Low-Hysteresis, Foldable, Fabric Pneumatic Artificial Muscle." IEEE Robotics and Automation Letters 5 (2): 3406-13. https://doi-org.proxy.library.upenn.edu/10.1109/LRA.2020.2976309.

Nakamura, Lisa. 2014. "Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture." American Quarterly 66 (4): 919-41. https://doi-org.proxy.library.upenn.edu/10.1353/aq.2014.0070.

Noble, Safiya U. 2018. Algorithms of Oppression. New York University Press.

--. 2021. "The Logics of (Digital) Distortion." Interactions 28 (6): 41-45. https://doi-org.proxy.library.upenn.edu/10.1145/3490321.

OED Online. 2021. S.v. "bias." Accessed November 29, 2021.

O'Gorman, Marcel. 2021. Making Media Theory: Thinking Critically with Technology. London: Bloomsbury Publishing.

Park, Sun Young, Pei-Yi Kuo, Andrea Barbarin, Elizabeth Kaziunas, Astrid Chow, Karandeep Singh, Lauren Wilcox, and Walter S. Lasecki. 2019. "Identifying Challenges and Opportunities in Human-AI Collaboration in Healthcare." CSCW '19: Conference Companion Publication of the 2019 on Computer Supported Cooperative Work and Social Computing, November 2019, 506-10. https://doi-org.proxy.library.upenn.edu/10.1145/3311957.3359433.

Perez-Bustos, Tania, Eliana Sanchez-Aldana, and Alexandra Choconta-Piraquive. 2019. "Textile Material Metaphors to Describe Feminist Textile Activisms: From Threading Yarn, to Knitting, to Weaving Politics." Textile 17 (4): 368-77. https://doi-org.proxy.library.upenn.edu/10.1080/14759756.2019.1639417.

Plemons, Eric. 2012. Opening comments at the "What's New About 'New Materialisms?'" conference, May 2012, Berkeley, California.

Psarra, Afroditi, Daniela Rosner, and Gabrielle Benabdallah. 2021. On the Bias (course website). University of Washington. https://www.onthebiasthinking.com/

Cortes Rico, Laura, Jaime Patarroyo, Tania Perez-Bustos, and Eliana Sanchez Aldana. 2020. "How Can Digital Textiles Embody Testimonies of Reconciliation?" In PDC '20: Proceedings of the 16th Participatory Design Conference 2020, Participation(s) Otherwise, vol. 2, 109-13. https://doi-org.proxy.library.upenn.edu/10.1145/3384772.3385137.

Rogers, Hannah Star. 2022. Art, Science, and the Politics of Knowledge. Cambridge, MA: MIT Press.

Rogers, Hannah Star, Megan K. Halpern, Dehlia Hannah, and Kathryn de Ridder-Vignone, eds. 2021. Routledge Handbook of Art, Science, and Technology Studies. New York: Routledge.

Rosner, Daniela K. 2018. Critical Fabulations: Reworking the Methods and Margins of Design. Cambridge, MA: MIT Press.

Rosner, Daniela K., Samantha Shorey, Brock R. Craft, and Helen Remick. 2018. "Making Core Memory: Design Inquiry into Gendered Legacies of Engineering and Craftwork." CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, April 2018, paper no. 531, 1-13. https://doi-org.proxy.library.upenn.edu/10.1145/3173574.3174105.

Sherman, Jihan, Takeria Blunt, and Patrick Fiorilli. 2019. "Telling the Bees: Designing for Immersion, Mediation, and Ritual." In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, 391-98.

Shorey, Samantha, and Daniela K. Rosner. 2019. "A Voice of Process: Re-presencing the Gendered Labor of Apollo Innovation." communication+ 1 7 (2): 4. https://doi-org.proxy.library.upenn.edu/10.7275/yen8-qn18.

Smith, Hinekura. 2019. "Whatuora: Theorizing 'New' Indigenous Research Methodology from 'Old' Indigenous Weaving Practice." Art/Research International: A Transdisciplinary Journal 4 (1): 1-27. https://doi-org.proxy.library.upenn.edu/10.18432/ari29393.

Stoler, Ann Laura. 2010. Along the Archival Grain: Epistemic Anxieties and Colonial Common Sense. Princeton, NJ: Princeton University Press.

Sutton, Barbara, and Nayla Luz Vacarezza. 2020. "Abortion Rights in Images: Visual Interventions by Activist Organizations in Argentina." Signs: Journal of Women in Culture and Society 45 (3): 731-57. https://doi-org.proxy.library.upenn.edu/10.1086/706489.

TallBear, Kim. 2017. "Beyond the Life/Not-Life Binary: A Feminist-Indigenous Reading of Cryopreservation, Interspecies Thinking, and the New Materialisms." In Cryopolitics: Frozen Life in a Melting World, edited by Joanna Radin and Emma Kowal, 179-202. Cambridge, MA: MIT Press.

Tavris, Carol. 2021. "Bias Cut: How Best to Achieve Diversity in the Workplace." TLS: Times Literary Supplement 6186 (October 21, 2021): 14-15. https://www-the-tls-co-uk.proxy.library.upenn.edu/articles/bias-interrupted-joan-c-williams-book-review-carol-tavris/.

Tobin, Jacqueline, and Raymond G. Dobard. 1999. Hidden in Plain View: A Secret Story of Quilts and the Underground Railroad. New York, NY: Doubleday.

Urton, Gary, and Alejandro Chu. 2019. "The Invention of Taxation in the Inka Empire." Latin American Antiquity 30 (1): 1-16. https://doi-org.proxy.library.upenn.edu/10.1017/laq.2018.64.

Varghese, Nirmala, and G. Thilagavathi. 2012. "Fit of Sari Blouse: Influencing Parameters and Assessment." Journal of Textile and Apparel, Technology and Management 7 (4): 1-19. https://ojs.cnr.ncsu.edu/index.php/JTATM/article/view/2995.

Wernimont, Jacqueline. 2019. Numbered Lives: Life and Death in Quantum Media. Cambridge, MA: MIT Press.

Daniela K. Rosner

Department of Human Centered Design and Engineering, University of Washington

dkrosner@uw.edu

Rosner, Daniela. 2022. "The Bias Cut: Toward a Technopoetics of Algorithmic Systems." Catalyst: Feminism, Theory, Technoscience 8 (2): 1-21.

Author Bio

Daniela Rosner is an Associate Professor in Human Centered Design & Engineering (HCDE) at the University of Washington and co-director of the Tactile and Tactical Design (TAT) Lab. She has published on the social, political, and material circumstances of technology development and use, with a longstanding interest in sites of innovation such as electronics maintenance and needlecraft historically overlooked within Western engineering cultures. Rosner serves as an Editor-in-Chief of Interactions magazine, a bimonthly publication of ACM SIGCHI.

